Tags: #machine learning
-
Implementing BatchNorm in Neural Net
BatchNorm is a relatively new technique for training neural nets. It gives us a lot of flexibility when initializing the ...
-
Implementing Dropout in Neural Net
Dropout is a simple way to regularize a neural net model. It is one of the recent advancements in Deep ...
-
Beyond SGD: Gradient Descent with Momentum and Adaptive Learning Rate
There have been many attempts to improve Gradient Descent: some add momentum, some add an adaptive learning rate. Let's see what's out ...
-
Implementing Minibatch Gradient Descent for Neural Networks
Let's use Python and NumPy to implement the Minibatch Gradient Descent algorithm for a simple 3-layer Neural Network.
-
Parallelizing Monte Carlo Simulation in Python
Monte Carlo simulation is all about quantity. It can take a long time to complete. Here's how to speed it ...
-
Slice Sampling
An implementation example of Slice Sampling for a special case: a unimodal distribution with a known inverse PDF.
-
Rejection Sampling
Rejection is always painful, but it's for the greater good! You can sample from a complicated distribution by rejecting samples!
-
Metropolis-Hastings
An implementation example of the Metropolis-Hastings algorithm in Python.
-
Gibbs Sampling
An example of a Gibbs Sampling implementation in Python to sample from a bivariate Gaussian.